On Nonconvex Optimization for Machine Learning

Authors

Abstract

Gradient descent (GD) and stochastic gradient descent (SGD) are the workhorses of large-scale machine learning. While classical theory focused on analyzing the performance of these methods in convex optimization problems, the most notable successes in machine learning have involved nonconvex optimization, and a gap has arisen between theory and practice. Indeed, traditional analyses of GD and SGD show that both algorithms converge to stationary points efficiently. But these analyses do not take into account the possibility of converging to saddle points. More recent theory has shown that GD and SGD can avoid saddle points, but the dependence on dimension in these analyses is polynomial. For modern machine learning, where the dimension can be in the millions, such dependence would be catastrophic. We analyze perturbed versions of GD and SGD and show that they are truly efficient: their dimension dependence is only polylogarithmic. Indeed, these algorithms converge to second-order stationary points in essentially the same time as they take to converge to classical first-order stationary points.
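To make the "perturbed GD" idea concrete, here is a minimal sketch in Python: run ordinary gradient descent, and when the gradient norm falls below a threshold (signalling proximity to a stationary point that could be a saddle), inject noise drawn uniformly from a small ball. The callable grad_f and all constants (eta, g_thresh, radius, t_noise) are illustrative placeholders, not the tuned values from the paper's analysis.

```python
import numpy as np

def perturbed_gd(grad_f, x0, eta=0.1, g_thresh=1e-3, radius=1e-2,
                 t_noise=10, n_iters=1000):
    """Sketch of perturbed gradient descent (PGD).

    Runs plain gradient descent, but when the gradient is small,
    i.e. the iterate is near a first-order stationary point that may
    be a saddle, adds noise sampled uniformly from a small ball.
    """
    x = np.asarray(x0, dtype=float)
    last_noise = -t_noise  # permit a perturbation at the first trigger
    for t in range(n_iters):
        g = grad_f(x)
        if np.linalg.norm(g) < g_thresh and t - last_noise >= t_noise:
            # Uniform sample from a ball of the given radius:
            # a random direction scaled by radius * U^(1/d).
            direction = np.random.randn(*x.shape)
            direction /= np.linalg.norm(direction)
            x = x + direction * radius * np.random.rand() ** (1.0 / x.size)
            last_noise = t
        else:
            x = x - eta * g  # ordinary gradient step
    return x
```

On the toy saddle f(x, y) = x^2 - y^2, for example, an iterate started exactly on the x-axis stalls at the origin under plain GD, whereas the injected noise moves it off that unstable manifold, after which the gradient steps escape along the -y^2 direction.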


Similar articles

A SMART Stochastic Algorithm for Nonconvex Optimization with Applications to Robust Machine Learning

Machine learning theory typically assumes that training data is unbiased and not adversarially generated. When real training data deviates from these assumptions, trained models make erroneous predictions, sometimes with disastrous effects. Robust losses, such as the Huber norm, were designed to mitigate the effects of such contaminated data, but they are limited to the regression context. In t...

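For reference, the Huber loss mentioned in the abstract above is the standard robust alternative to the squared error; with residual \(r\) and transition parameter \(\delta\) it is

\[
\ell_\delta(r) =
\begin{cases}
\frac{1}{2}\,r^2, & |r| \le \delta,\\
\delta\,\bigl(|r| - \frac{1}{2}\,\delta\bigr), & |r| > \delta,
\end{cases}
\]

quadratic near zero, so it behaves like least squares on clean data, but only linear in the tails, so a single contaminated point has bounded influence on the fit.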

Nonconvex Optimization Using a Fokker-planck Learning Machine

A new algorithm for nonconvex optimization by means of a so-called Fokker-Planck Learning Machine is proposed in this paper. This is done by considering the Fokker-Planck (FP) equation related to continuous simulated annealing, which has been proven to converge to the global optimum under certain conditions. An approximate solution to the FP equation is sought by parametrizing the transition...

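For context, continuous simulated annealing is the Langevin diffusion \(dx = -\nabla f(x)\,dt + \sqrt{2\,T(t)}\,dW\), and the Fokker-Planck equation governing the density \(p(x,t)\) of its iterates has the standard form below; the paper's exact parametrization may differ, so treat this as the generic textbook version:

\[
\frac{\partial p}{\partial t} = \nabla \cdot \bigl(p\,\nabla f\bigr) + T(t)\,\Delta p .
\]

At a fixed temperature \(T\) the stationary solution is the Gibbs density \(p(x) \propto e^{-f(x)/T}\), which concentrates on the global minima of \(f\) as \(T \to 0\); this is what makes the FP equation a natural vehicle for global optimization.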

Optimization for Machine Learning

This is a draft containing only sra chapter.tex and an abbreviated front matter. Please check that the formatting and small changes have been performed correctly. Please verify the affiliation. Please use this version for sending us future modifications.



Journal

Journal title: Journal of the ACM

Year: 2021

ISSN: 0004-5411, 1557-735X

DOI: https://doi.org/10.1145/3418526